Biden and Trump Both Lost This Week - Reading the portents of the off-year elections.
The Meta Narrative: What We’ve Learned from the Facebook Papers - Thousands of documents uncovered by Frances Haugen reveal, among other things, how employees of Facebook—or Meta, as it’s now known—talk when they think that no one is listening.
Is This the Worst Place to Be Poor and Charged with a Federal Crime? - The Southern District of Georgia does remarkably little to provide for indigent defendants.
The Trial of Kyle Rittenhouse Begins with Gruesome Videos and a Plea for Fact-Finding - The rifle-wielding teen-ager killed two men and grievously wounded a third during racial-justice protests in Kenosha, Wisconsin.
Why Was the New Jersey Gubernatorial Race So Close? - A pollster for Governor Phil Murphy explains how Republicans nearly pulled off an upset in the Garden State.
Can Facebook be redeemed? Twelve leading experts share bold solutions to the company’s urgent problems.
Facebook is broken, and after a recent deluge of damning internal company leaks to the press and Congress, the world has unassailable proof of how troubled it really is.
Almost 2 billion people around the world use a product owned by Meta (formerly called Facebook), including WhatsApp and Instagram, every day. For many of its users, the company, valued at nearly $1 trillion, is the internet and their primary platform for communication and information. Millions of us depend on its products in one way or another.
So what can be done to fix Facebook? Or is it past the point of fixing?
The documents leaked by employee whistleblower Frances Haugen, which were first reported by the Wall Street Journal in late September, revealed a host of problems: how Facebook-owned Instagram can be detrimental to teenagers’ mental health, how the company struggled to contain erroneous anti-vaccine Covid-19 content posted by its users, and how political extremism spread on the platform leading up to the January 6 Capitol riot. The documents Haugen leaked also showed that Facebook was seemingly aware of serious harms caused by its products, but in many cases failed to sufficiently address them.
In a statement, Facebook spokesperson Drew Pusateri responded in part: “We take steps to keep people safe even if it impacts our bottom line. To say we turn a blind eye to feedback ignores these investments, which includes the over $5 billion we’re on track to spend this year alone on safety and security, as well as the 40,000 people working on these issues at Facebook.”
For years, Congress has debated whether and how it should regulate Facebook and other major social media products like Twitter, TikTok, Snap, and Google-owned YouTube. Outside researchers have been raising concerns that these platforms may be causing grave long-term harm to society at large. American users across the political spectrum have become increasingly suspicious of Big Tech. And even Facebook itself has said it welcomes regulation (while at the same time opposing some regulatory efforts, like strengthening antitrust laws). But so far, federal bills to regulate privacy, competition, or other aspects of social media businesses have gone nowhere.
Now, the gravity of the new reporting about Facebook — particularly the research about Instagram’s harm to teenagers — is leading many Republicans and Democrats to agree that even if their political motivations are different, something must be done to rein in Facebook.
And it’s not just Congress that’s thinking about Facebook’s problems and how to deal with them; so are social scientists, the company’s former and current employees, policy experts, and the many people who use its services.
Even Facebook says it is seeking guidance on how to address some of its problems. The company says that, for two and a half years, it has been calling for updated regulations on its business.
“Every day, we make difficult decisions on where to draw lines between free expression and harmful speech, privacy, security, and other issues, and we use both our own research and research from outside experts to improve our products and policies,” wrote Pusateri. “But we should not be making these decisions on our own which is why for years we’ve been advocating for updated regulations where democratic governments set industry standards to which we can all adhere.”
So now is an urgent time to explore ideas old and new — inside and outside the realm of political reality — about how to confront a seemingly intractable problem: Can Facebook be fixed?
To try to answer that question, Recode interviewed 12 of the leading thinkers and leaders on Facebook today: from Sen. Amy Klobuchar, who is leading new Senate legislation to update antitrust laws for the tech sector; to Stanford Internet Observatory researcher Renée DiResta, who was one of the first researchers to study viral misinformation on the platform; to former Facebook executive Brian Boland, who was one of the few high-ranking employees at the company to speak out publicly against Facebook’s business practices.
First, most believe that Facebook can be fixed, or at least that some of its problems can be meaningfully improved. Their ideas are wide-ranging, with some more ambitious and unexpected than others. But common themes emerge in many of their answers, revealing a growing consensus about what Facebook needs to change and a few different paths that regulators and the company itself could take to make it happen.
Some of the approaches the experts propose are sweeping, like breaking up the company or banning its surveillance-based advertising model. Others are more incremental, like redesigning Facebook’s Groups, the part of the app that has been a breeding ground for conspiracy movements like QAnon, anti-vaccine activism, and extremist political events.
The interviews were conducted separately. In each, Recode asked, “How would you fix Facebook?” Each expert defined on their own what they believe are Facebook’s biggest problems, as well as how they would fix them. Recode then asked follow-up questions based on the interviewees’ answers. These interviews have been combined, condensed, and edited for length and clarity.
Their answers are in no way a comprehensive list of all the possible solutions to Facebook’s problems, and many of them would be difficult to achieve anytime soon. But they offer a thoughtful start during a pivotal moment, as millions of people are reconsidering the bargain they agree to each time they use the company’s products.
Sen. Amy Klobuchar (D-MN) has long been a leader in Congress calling for regulation of the social media industry, on topics from political advertising to health misinformation. In October, Sen. Klobuchar introduced a Senate antitrust bill aimed at stopping major tech platforms from using their power to unfairly disadvantage competitors. Klobuchar is also the chair of the Senate Judiciary Committee’s antitrust subcommittee.
How would you fix Facebook?
First, federal privacy law. Second, protecting kids online. Third, antitrust updates [and] law changes, to make our laws as sophisticated as the companies that are now in our economy. And then finally, doing something about the algorithms.
Can you explain what you would do in each of those areas?
People have to opt in if they want their data shared. When Apple recently gave their users a decision about whether to have their data tracked, 75 percent did not opt in. And that is what you would see across platforms, if it actually was a clear choice. Which it never is — it’s very confusing.
Secondly, protecting kids online: that would include expanding the protections of the Children’s Online Privacy Protection Act.
You can’t doubt that Facebook developed an innovative product. Yes, they did. But they clearly haven’t been able to compete with the times in terms of what innovations could protect people from the problems they’re having now, like for parents that don’t want to get their kids hooked.
So my argument is that by allowing the antitrust laws to actually work and be updated, then you’re going to be able to look at some of these past mergers, like Instagram.
And here we’re not talking about “destroying” Facebook or all these dramatic words, we’re talking about looking at the industry as a whole and figuring out if we need to update our competition laws, to track everything from what’s happening with the app stores to what’s happening with the platforms when it comes to selling stuff, so that they cannot be preferencing their own content and discriminating against competitors. I believe that is one way — but not the only way — of using the marketplace to push innovations and responsiveness to these problems.
How would you reform Section 230?
The one where we need to do the most work to figure out while still respecting free speech is [why] they’ve got total immunity when they amplify [harmful] stuff.
I already have a bill out there to get rid of the immunity for vaccine misinformation during a health crisis, as well as one that [Sen. Mark] Warner’s (D-VA) leading with Mazie Hirono (D-HI) and myself, which is about discrimination, violent conduct, and civil rights violations and the like.
Do you think Facebook can really change with Mark Zuckerberg in charge?
Have I been impressed by how he’s handled this latest crisis? No. He went sailing and issued posts from his boat. Basically he was saying, “Yeah, we’ll look at this,” but we got a whole week of no apologies. And that’s fine. He can choose not to apologize. That’s up to him. That’s a PR decision. But I think we are beyond expecting that he’s going to make the changes or whoever’s in charge of Facebook is going to make the changes. I think it’s time for us to act.
Matt Stoller is a leading critic of monopoly power in the US economy, particularly in tech. He’s the author of the book Goliath: The 100-Year War Between Monopoly Power and Democracy.
How would you fix Facebook?
One, I would send Mark Zuckerberg to jail for securities fraud and advertising fraud. Maybe Sheryl Sandberg too, for insider trading. There, you have a culture of lawlessness, and you have to address that; it’s a threat to the law. So we’ve got to start there.
They lied to advertisers around their reach. And that caused advertisers to spend more money on Facebook than they would have. And with these advertising frauds, they decided not to tell investors. [Editor’s note: Facebook has been sued by advertisers for allegedly inflating key metrics around how many of its users actually see advertisements companies pay for.]
Then, No. 2, I would break up the firm. The mergers of Instagram and WhatsApp are illegal, and they should be unwound. That would create fairer competition in the social media market. And when firms compete, they usually have to compete by differentiating their product around quality. I would also break off Facebook’s advertising business and sever it from the rest of the firm. [Editor’s note: Together with Google, Facebook’s advertising business represents a majority of all advertisements sold online in the US. Some have proposed separating these companies’ advertising business lines from their other lines of business to increase competition.]
And No. 3, set clear rules of the road for the industry around advertising. Just ban surveillance advertising. When I think about the problem, I look at it and I say, “Okay, this is a firm that has an advertising model that is based on undermining social stability.” They break the law and use legal power to fortify and protect their business model. So you have to address that. That’s the problem that I see.
Why do you think criminal liability for Mark Zuckerberg is a higher priority than antitrust action like breaking up the company?
Antitrust or any regulatory policy is going to take several years to really go into force. And these guys just don’t care. They don’t care what the government does. They simply don’t believe that anything will affect them. And the only way to address that problem is to actually bring the problem straight to them. And that means sanctioning them personally: threatening to take away their freedom for violating the law. You have to make the stakes real.
The point here isn’t that Mark Zuckerberg is a bad guy. The point here is that you have a culture of lawlessness at the firm.
Brian Boland is one of the few former Facebook executives to publicly criticize the company for its business practices, arguing that Facebook needs to be more transparent about the proliferation of viral misinformation and other harmful content on its platform. Boland used to be a vice president of partnerships and marketing, and worked at the company for 11 years.
How would you fix Facebook?
We need to dramatically improve the safety and privacy of the platform. This breaks down into at least three things — the creation of a fully empowered regulatory body that has oversight over digital companies, reforms of Section 230, and meaningful transparency.
The one thing that Facebook could control right now is transparency. Helping society understand the harms on social media is an important step for fixing the problems. Twitter just shared research data on which political content gets more distribution on Twitter. That’s a great step where they are taking the lead.
Why is a regulatory body so important?
A regulatory body is in line with how we’ve generally worked in the United States when we’ve wanted to rein in industries that are out of control. The same way that we have building codes, that we regulate the chemicals industry. The food supply used to be unsafe, but then the FDA was created to make it safe. If you think about your car that you’re getting in every day, the National Highway Traffic Safety Administration keeps you safe by making sure the car is safe.
So the things that we need to do for digital are just like all the other regulation that we’ve done before. That still gives people the great products, right? You still have awesome cars, you still have amazing food, and there are chemicals you use every day in your life. And the building that you’re in right now is not going to collapse. We just need to do the same thing with digital platforms and services and have that regulatory body and oversight to understand what’s harmful and broken, and then the regulatory authority to mandate fixing those things.
How would you go about making data more transparent?
I think you start to make data feeds of public data available, in the same way that you have engagement data available in CrowdTangle. But you ensure that it spans the globe and has metrics like reach and engagement and distribution, so people can see what gets recommended [and] goes viral.
Algorithms aren’t good or bad, they just promote things based on the way they’ve been initially coded, and then what they learn along the way, so it’s not like people deeply understand what algorithms do or why they do it.
What would you change in Section 230?
There are two important elements for me: including a provision for a duty of care and removing protections of what algorithms amplify. A duty-of-care provision would ensure that Section 230 doesn’t remove the responsibility of platforms to reduce harms to their customers. This wouldn’t require that every harmful act is removed, but that the platforms take meaningful steps to reduce harm.
For the second part, we can ensure that we protect people’s free speech on platforms like Facebook but actually hold the platforms accountable for what they choose to amplify. These algorithms take actions that make some speech heard far more than other speech. Facebook has control over its algorithms and should not be protected from the harms those algorithms can create.
Do you think Facebook can be reformed with Mark Zuckerberg at the helm?
There’s a chance, with strong regulatory oversight, that they’ll be forced to change — but his nature is not to move in this direction. If we want Facebook and Instagram to be responsible and safer, then I don’t think you can have him and the current leadership team leading the company.
Roger McNamee is an early Facebook investor and former adviser to Mark Zuckerberg. He famously changed his opinion of the company after he saw what he believed were serious failures in its leadership and business priorities.
How would you fix Facebook?
In my opinion, you need to have three forms of legislative relief. You need to address safety, you need to adjust privacy, and you need to address competition. If Facebook were to disappear tomorrow, 100 companies would compete to fill the void, doing all the same horrible things Facebook is doing. So whatever solutions we craft must be broad enough that they prevent that from happening.
On safety, I recommend that the government create an agency, analogous to the Food and Drug Administration, that would set guidelines for which technologies should be allowed to come to market at all, and what rules they would have to follow to create a commercial product and then to remain in the market.
How do you address privacy issues?
My mentor and friend Shoshana Zuboff said this best, which is that surveillance capitalism is as morally flawed as child labor, and should be banned for the same reason.
The starting place would be to ban any third-party use of location, health, financial, app usage, web browsing, and whatever other categories of intimate data are out there.
You used to have a relationship with Mark Zuckerberg as an early investor. Do you have any confidence that the company can be fixed under his leadership?
I think this is the wrong question, if you don’t mind my saying so. I think that the underlying issue here is, we tell CEOs that their only job is to maximize shareholder value. It used to be that you told CEOs that they had to find a balance between shareholders, employees, the communities where employees live (including the country where they live), and its customers and suppliers. They had five constituents, and now we only have one: [shareholders]. And so it’s important to recognize that a big part of what’s wrong here is that we have operated in an environment where we just applied the incorrect set of incentives to managers in any field, and Mark has just been more successful than other people in creating a product that took advantage of the complete absence of rules.
Benedict Evans is one of the tech industry’s leading analysts and thinkers on the business of social media. He is an independent analyst, and used to work for the venture capital firm Andreessen Horowitz, which was an early investor in Facebook.
Do you think anything else needs to be done to fix many of the problems Facebook is criticized for? And if so, what do you think should be done?
We are clearly on a path toward regulatory requirements around content moderation both in the EU and the UK. I don’t know how you could do that in a way that could be reconciled with the American Constitution — it sounds like a legal requirement to remove speech.
You can believe that there’s a lot of nonsense talked about Facebook and also believe that it has huge problems, isn’t on top of them, and doesn’t have the incentive structures right. But it’s amazing to me how much of the press and how many politicians completely ignore YouTube, which has almost exactly the same problems.
Why do you think breaking up Facebook is not the right response?
What is the theory for why changing who owns Instagram would stop teenage girls from looking at self-harm content, and for that content being shared and suggested? Why would the dynamics change? Such a move certainly would not make it any easier to compete with Instagram, just as making YouTube a separate company would not make it any easier to make a new video-sharing site. The network effects are internal to the product, not the ownership. It also wouldn’t change the business model.
To take an analogy from another generation, there are all sorts of problems with cars, and they kill people, but that doesn’t make it sensible to compare them with tobacco. And we can punish GM for shipping a car it knows will blow up in a low-speed rear collision, but we can’t make it stop teenage boys getting drunk and driving too fast. Not everything is an antitrust problem, and most policy problems are complicated and full of trade-offs. Tech policy isn’t any simpler than education policy or health care policy.
I often think the sloganeering around “break them up!” and indeed the new comparison of tech to tobacco is displacement: People are hunting for simple slogans and easy answers that let you avoid having to grapple with the complexity of the issues.
In the US, the cult of the First Amendment makes this even harder. The US cannot pass laws requiring social media companies to remove X or Y, whereas the UK and EU are already well on the way to passing such laws, which makes “break them up” an even stronger form of displacement — it’s what you can do as a US politician, rather than what can work.
Rep. Ken Buck (R-CO) is a leading Republican in Congress on regulating tech. He co-led the historic Congressional investigation into Big Tech and antitrust which finished last year, and has been one of the most senior members of his party to join with Democrats in bipartisan legislation to strengthen antitrust laws.
How would you fix Facebook?
The obvious dangers of the platform are that bad people can use it for evil purposes. And then there are other unintended consequences where good people use it and are harmed through no fault of their own, but just because of the psychological impact.
When there’s a study that shows that something was dangerous with a car or with a food product, there’s a recall.
Facebook should be able to recall its product and to ameliorate the damages that are done before it goes too far. And they didn’t do that. Part of it has to be a personnel issue with leadership and the failure of leadership.
What’s the personnel issue? What changes would you make there?
I think that people who were in the know and realized that there was an increase in teen suicide rates, and that there was a relationship between their product and that increase — and they continued doing what they did — should be held criminally liable.
And as a member of Congress, what can you do? What are you doing to try to hold those people responsible?
I think that the role of Congress is to examine the situation — which we did with a 16-month investigation on the antitrust subcommittee — [and] expose the problems. And obviously, we saw things from the outside that now the whistleblower has confirmed from the inside with very damaging documents.
Two, trying to fix the situation which we are in with antitrust laws, and perhaps with reforms to Section 230. [Editor’s note: Section 230 is a landmark internet law that shields social media companies from being sued for most kinds of illegal activity committed by their users]. And then No. 3, it’s really up to the executive branch to make a decision on whether there is criminal liability, civil liability, and how to proceed.
Do you think Facebook should be broken up into separate companies?
I’m not sure that breaking up Facebook from Instagram makes as much sense as having other companies that are competing with Facebook and Instagram, in trying to innovate better and, frankly, offer parents an alternative.
I’m absolutely opposed to regulation. I don’t think the government should say, “This is appropriate speech in the newspaper or on Facebook or on Twitter.” I don’t think the government should say, “This is a feature that is positive or negative.” I think we’ve got to give consumers a choice. I think we make much more rational decisions when consumers make that choice.
When someone associates the word regulation with me, they think I’m going crazy. When they associate the words “competition in the marketplace” with me, they’re thinking, “Oh, okay, now I understand.”
Do you think that Facebook can be fixed with Mark Zuckerberg at the helm?
I think he has to take full responsibility and either take himself out of the picture, and others out of the picture, or make sure that changes are made so that he’s getting better information to make better decisions. But Facebook cannot continue to exist, should not continue to exist, the way they have.
Rashad Robinson is the president of Color of Change, a civil rights advocacy group that co-led a historic advertiser boycott against Facebook in the summer of 2020 in protest of the proliferation of hate speech on the platform.
How would you fix Facebook?
I would have Instagram and WhatsApp owned by other people. And so I would shrink it.
And I would create real consequences and liability to its business model for the harm that it causes. And I would force Facebook to actually have to pay reparations for the harm they have done to local independent media, and to all the sorts of institutions that their sort of platform has destroyed.
Do you think you’ve seen progress since you helped lead the boycott against Facebook?
At that time [of the boycott], we didn’t have any levers within the government. There was no one to ask at the White House to get involved in this. Now a year has passed and we have a Biden administration. And so my demands are not to Facebook anymore, my demands are to the Biden administration and to Congress, and to tell them that they actually have to do their jobs, that we have outsized harm being done by this platform, and they actually have to do something about it.
What would real consequences look like for Facebook?
I’m not the numbers guy, but I do think [the consequences that] we’ve seen in the past from the FTC and other places have been the equivalent of a maybe expensive night out for [Facebook]. [Editor’s note: In 2019, the FTC fined Facebook $5 billion for its privacy failures in the Cambridge Analytica scandal. While it was a record-breaking fine imposed by the FTC, it failed to hinder Facebook’s financial performance and growth.]
I think that surveillance marketing on these platforms [Editor’s note: Surveillance marketing, or surveillance capitalism, is the pejorative name for business models — such as the ones that underpin Facebook and Google — that track people’s online behavior to target specific advertisements to them], combined with these platforms being able to have Section 230 protections, has to end — you can’t have it both ways.
Do you think Facebook can be fixed with Mark Zuckerberg in charge?
The current leadership lacks the sort of moral integrity to be the type of problem solvers our society needs. And the sooner they deal with the structures that have allowed them to be in charge, the better for all of us. But to be clear, this moment we are in — the story will be told in generations about who Mark Zuckerberg is and what he has done. And Mark Zuckerberg will always want to play by a different set of rules. He believes he can. He’s built a system for that.
Nate Persily co-founded an academic partnership program with Facebook in 2018, called Social Science One, which aimed to give researchers who are studying the real-world effects of social media unprecedented access to otherwise private Facebook data.
In 2020, Persily resigned from the program. He has since discussed the limitations of voluntary programs like Social Science One and is calling for legislation to legally mandate companies like Facebook to share more information with outside researchers.
How would you fix Facebook?
The internet platforms have lost their right to secrecy. They simply are too big and too powerful. They cannot operate in secret like a lot of other corporations. And so they have an obligation to give access to their data to outsiders.
I have been working on this for five years. I’ve tried to do it with Facebook, and I’ve become convinced that legislation is the only answer.
And why do you think this is the first of Facebook’s problems to fix?
There is a fundamental disagreement between conventional wisdom and what the platforms are saying on any number of these issues.
That’s where the Haugen revelations are so momentous. It’s not just that you see quasi-salacious stuff about what’s happening on the platforms — it’s that you actually get a window into what they have access to and the kind of studies that they can perform. And then you start saying, “Well, look, if outsiders with public spirit had access to the data, think about what we could learn right now.”
Of course, all of this has to be done in a privacy-protected way to make sure that there’s no repeat of another Cambridge Analytica — and that’s where the devil is in the details.
Why should the average person care about Facebook being transparent with its data with researchers?
If you think that these platforms are the cause of any number of social problems stretching from anorexia to genocide, then we cannot trust their representations as to whether social media is innocent or guilty of committing these problems or contributing to these problems. And so [transparency] is a prerequisite to any kind of policy intervention in any of these areas, as well as actions by civil society. So part of it is informing governments and policymakers, but some of it is also informing us about what the dangers are on the platforms and how we can act to prevent them.
Transparency is a meta problem, if you will. It is the linchpin to studying every other problem as to the harms that social media is wreaking on society. And let me also say, we should be prepared for the possibility that when we do have access to the data that the truth is going to be not as bad as people think.
The story could be a much more complicated one than algorithms are manipulating people into doing things that they otherwise wouldn’t do.
How do you make sure that Facebook is transparent with the data?
It’s quite simple. The FTC, working with the National Science Foundation, shall develop a program for vetted researchers and research projects, and shall compel the platforms to share the data with those researchers in a privacy-protected format. The data will reside at the firms [and will] not be given over to the federal government so that we prevent another Cambridge Analytica.
It’s also not just about requiring data transparency [with researchers]. We should require [social media platforms] to disclose certain things to the public that are not privacy-dangerous. Basically, something like, “What are the most popular stories and popular links on Facebook each day?” That is not privacy-endangering.
Massachusetts Sen. Ed Markey (D) has been a key congressional voice on online privacy for children for over two decades. He co-introduced the Children’s Online Privacy Protection Act of 1998 (COPPA), a law requiring tech companies to obtain parental consent “before collecting, using, or disclosing personal information from children” under 13. Today, he’s focused on updating COPPA and making broader reforms to the tech industry.
How would you fix Facebook?
Number one, I will pass the Children’s Online Privacy Protection Act 2.0. I was the author of that law in 1998 that’s been used to protect children in company after company. We have to upgrade that law in order to pass a long-overdue bill of rights for kids and for teens, so that kids under 16 get the same protection as kids under 13.
I would say [we should also] ban targeted ads to children and create an online eraser button, so parents and children can tell companies to delete the troves of data that they have collected about young people. And have a cybersecurity protection requirement for kids and teens.
Because it’s obvious that Facebook only cares about children to the extent to which they are of monetary value.
Why children’s privacy first over many issues that Facebook has, like misinformation?
Kids are uniquely vulnerable. And we adults need to make sure that their data is not being used in ways that are harmful to them.
Facebook won’t protect young people. It can’t be voluntary any longer; it does not work.
Do you think Facebook can be fixed with Mark Zuckerberg at the helm?
I think regardless of who is running Facebook, we have to put a new, tough regulatory scheme in place in the event that Mark Zuckerberg leaves and his successor has the exact same philosophy. So we can’t trust the institution. We have to trust our laws.
Do you think Facebook should be broken up?
I think that the antitrust process is something that should begin. But just breaking up Facebook won’t solve the problems that we’re discussing today. We need to pass an impressive set of laws that stop social media giants from invading our privacy.
Renée DiResta is a longtime researcher of disinformation on social networks. She advised Congress on the role of foreign influence and misinformation networks in the 2016 US elections. DiResta was also one of the first social media researchers to track how anti-vaccine content and other kinds of false or extremist content spread through Facebook Groups.
How would you fix Facebook?
Groups are probably the most broken things on the platform today.
If I could pick one thing to really focus on in the short term, it would be a more sophisticated rethinking of groups: how groups are recommended to people, and how groups are evaluated for inclusion and for being promoted to other people.
Why do you think fixing groups is more important than, say, what people see in their newsfeed?
Because [groups] are a very, very significant part of what you see in your feed.
QAnon came out of these groups that were recommended to people, and then they came to be places where people really felt that they had found new friends and, in a sense, a kind of insularity. They evolved into echo chambers, and the groups became deeply disruptive.
But Facebook did not appear to have sophisticated metrics for evaluating [if] what was happening within groups was healthy or not healthy. The challenge became: Once groups are formed, disbanding them is a pretty major step. Perhaps one example of this is the Stop the Steal group, which grew to several [hundred thousand] people or more. [The Stop the Steal Facebook group was one of the key platforms where organizers of the January 6 Capitol riot prepared to march on Washington, DC.]
How could Facebook better curate content?
I think there are certain areas where [Groups] should largely be kept out of the recommendation engine entirely. I believe there are plenty of researchers who disagree with me, but I do believe that there are many areas where it’s not a problem to allow the content to be on site — it’s more a matter of it being amplified and pushed to new people.
[But] health misinformation actually kills people. Like, there is a non-theoretical harm that is very, very real. And that’s where I argue for certain cases being treated distinctly differently. You’re not going for six people being wrong on the internet, or at the local pub, or standing at the local corner with a bullhorn. That’s not what we’re going for. When we give people amplification, when we enable them to grow massive communities that trust in them [rather than] in authorities — which are institutions that actually have more accurate information — then we find ourselves in a situation where there are real negative impacts on real people in the real world. And so that question of, “How do we understand harms?” is actually the guiding principle that we should be using to understand, “How do we rethink curation?”
How would you fix Facebook?
I think one of the struggles with Facebook right now is just people see Mark, hear Mark, or see the name Facebook, and they just don’t trust anything that comes out of their mouths.
Are there changes in leadership at the top and fresh blood that are needed to help really give a new perspective, and really be somebody that people would listen to?
Can you talk a little bit about organizational and structural problems at Facebook?
Facebook’s such a flat company, and they want to move fast. They’re giving employees different metrics, and most of those are usually centered around growth. Then, when the Integrity team comes in and wants to make changes that might slow those numbers, you can get resistance. [The Integrity team at Facebook is responsible for assessing the misuses and unintended consequences of the platform.] Because that’s what people’s bonuses are attached to.
The tech world loves working in ones and zeros — they’re very data-driven. Data wins arguments. But the problems that the Integrity team is working on aren’t all data-centric. There’s a lot of nuance. There’s gonna be trade-offs. So if you’ve got Integrity as a whole separate team, they’re trying to go to another team and be like, “Hey, you should do this because it’s gonna produce X, Y, and Z harms.” But they’re like, “Well, that’s gonna screw up my metric, and then I’ll get a bad performance review.” So you end up pitting teams against one another, like Integrity and Product.
How would you fix that?
There’s no structural change that’s perfect.
But is it right for Integrity to be under Growth? Should it be separate? Should it be better integrated into the product lifecycle? One of the things that came out of some of these settlements around privacy is that there are particular procedures that the company had to put into place in order to make privacy considerations from the very beginning. So are there elements of that, that need to be done with the Integrity team?
Derek Thompson writes about economics, technology, and the media. He’s been writing about Facebook for several years, and his recent piece comparing Facebook to “attention alcohol” has sparked conversations about reframing how we think about social media.
How would you fix Facebook?
One, I would treat social media the way we treat alcohol: have bans and clearer limitations on use among teenagers. And study the effects of social media on anxiety, depression, and negative social comparison. Two, I would continue to shame Facebook to edit its algorithm in a way that downshifts the emphasis on high-arousal emotions such as anger and outrage. And three, I would hire more people to focus not on misinformation in the US, but on the connection between mis- or disinformation and real-world violence in places outside the US, where real-world violence flowing from these Facebook products is a common phenomenon.
What would it mean to treat Facebook the way we treat alcohol?
The debate about Facebook is way too dichotomous. It’s between one group that says Facebook is effectively evil, and another group that says Facebook is basically no big deal. And that leaves a huge space in the middle for people to treat Facebook the same way we think about alcohol. I love alcohol. I use alcohol all the time, the same way I use social media all the time. But [with alcohol], I also understand, based on decades of research and social norms, that there are ways to overdo it.
We have a social vocabulary around [alcohol] overuse and drinking and driving. We don’t have a similar social vocabulary around social media. And social media can be very good as a social lubricant — and also dangerous as a compulsive product, as we have with alcohol. And that’s why I see them as reasonably analogous.
How would you change Facebook’s algorithm?
Facebook is both a mirror and a machine. It holds up a high-quality mirror to human behavior and shows us a reflection that includes all of human kindness, and all of human generosity, and all of human hate, and all of human conspiracy theorizing, but it is also a machine that, through the accentuation of high-arousal emotions, brings forth or elicits the most outrage and the most conspiracy theorizing and the most absurd disinformation.
We can’t fix the mirror — that would require fixing humanity. But we can fix the machine, and it’s pretty clear to me that the Facebook algorithmic machine is optimized for surfacing outrage, indignation, hate, and other high-arousal negative emotions. I would like to see more research done not only by Facebook itself but also by any government, the NIH, maybe by Stanford and Harvard, on alternative ways of organizing the world’s information that aren’t dominated by the distribution of high-arousal negative emotions.
Can you explain why addressing Facebook’s issues in its operations outside the US is a priority problem that you would fix, and how you would fix that?
Most tech critics are hysterically over-devoted to the problems of technology in America, when these tech companies touch billions of people outside of America. And we should spend more time thinking about their impact outside of the country where their headquarters are based. Most of Facebook’s research into its negative effects, as I understand it, is focused on the effects of Facebook in the US. But we didn’t have WhatsApp- and Facebook-inspired genocide in the US.
Rebel troops are advancing toward the Ethiopian capital of Addis Ababa.
After a year of conflict, displacement, and growing humanitarian crises, Ethiopia’s civil war entered a new phase this week as a newly formed coalition of Tigrayan rebels and other minority groups began its advance on the federal capital of Addis Ababa.
The civil war, which began in November last year, has already killed thousands and displaced millions more; the UN says there have been brutal human rights violations on all sides, including a federal blockade of badly needed humanitarian aid into the northern Tigray region.
Now, the Tigray People’s Liberation Front, or TPLF, has responded by marching on the Ethiopian capital, Addis Ababa, along with a number of other rebel groups that have joined with the TPLF in opposition to Ethiopian Prime Minister Abiy Ahmed. While reports vary regarding their progress toward the capital, rebel forces may be as close as about 100 miles away, a spokesperson for the Oromo Liberation Army told CNN.
TPLF representatives in Washington, DC, announced the new coalition, called the United Front of Ethiopian Federalist and Confederalist Forces, on Friday, and made clear its intention to oust Abiy.
“Time is running out for him,” Berhane Gebrekristos, a former Ethiopian ambassador to the US and a TPLF leader, said during the announcement.
The coalition represents a new chapter in the conflict: previously, the TPLF, while powerful, had nonetheless been a minority force fighting the federal government largely on its own; now, the advance on Addis Ababa marks a potential shift in the conflict’s momentum.
In addition to the TPLF, there are eight other groups in the new coalition; of those, the OLA is among the most prominent.
The OLA represents the Oromo ethnic group, of which Abiy himself is a member. Previously, Abiy had enjoyed the support of the Oromo and Amhara people; however, a spate of arrests and murders of Oromo leaders and activists, in addition to economic and political marginalization, has turned many Oromo against Abiy.
The seven other factions in the opposition coalition represent other ethnic groups, of which there are about 80 in Ethiopia. Upon taking office, Abiy declared his government would distribute resources and power equally; however, the coalition announced its alliance on Friday “in response to the scores of crises facing the country” under his rule.
“The next step will be to organize ourselves and totally dismantle the existing government, either by force or by negotiation … then insert a transitional government,” Mahamud Ugas Muhumed, a representative of the Somali State Resistance, one of the coalition groups, said.
But while the alliance brings together other minority groups, all of which have militant factions, it remains unclear how effective the coalition will be in its push to depose Abiy.
“I don’t think it will have that much of an impact,” Gedion Timothewos, Ethiopia’s attorney general and justice minister, said during an online news conference Friday, calling the coalition a “publicity stunt.”
Some external experts, however, disagree. William Davison, a senior analyst with the International Crisis Group, told the New York Times this week that the coalition indicates “that the political tides are changing” in Ethiopia.
If nothing else, the coalition’s advance toward Addis Ababa appears to be causing anxiety in the capital. OLA forces report that they are about 100 miles from Addis Ababa, while other sources put the figure at 200 miles. The coalition also says it has been able to take strategic cities on the way. While the Abiy government says the rebels are exaggerating their victories, the government has also called on retired soldiers to pick up their weapons and fight the advancing forces.
“Dying for Ethiopia is a duty for all of us,” Abiy said last week.
The past month has seen further escalation of the civil war between the Tigrayan forces and Abiy’s administration, which started a year ago when TPLF forces launched what they described as a preemptive strike against a federal military base in Tigray.
Although the Tigrayan people are a minority ethnic group in Ethiopia, with a population of roughly 6 million concentrated in the northern state of Tigray, they emerged as a powerful force in the 1970s against Ethiopia’s Marxist military dictatorship, after years of marginalization.
Eventually, the TPLF dominated the coalition Ethiopian People’s Revolutionary Democratic Front, which overthrew the dictatorship in 1991. Tigrayan politicians led the government and dominated the coalition for nearly 30 years, overseeing economic growth despite famine and conflict in the region.
But the TPLF-led government was also known for torturing detainees and harsh crackdowns on dissent; by the time Abiy emerged on the political scene, anti-government protests had forced the previous prime minister to step down.
In the space of just three years, Abiy purged Tigrayan leadership from the federal government, essentially stripping the group of much of its former political power at a national level.
However, the state of Tigray is still under TPLF control, despite Abiy’s best efforts to centralize power. The TPLF’s resistance to the Abiy government — most notably by holding regional parliamentary elections in September 2020, despite Abiy’s decision to postpone them throughout the country — quickly spiraled into a full-blown conflict.
After early victories for Ethiopia’s national armed forces, the TPLF has been able to push back, recapturing the Tigrayan capital Mekele in June. Now, the group is gaining ground — as well as political and military support — with the new coalition.
But the recent developments have come at great cost for Tigray, as thousands have died in the conflict and 2 million have been displaced, both internally and externally.
A November 3 report from the UN — the most comprehensive thus far in the conflict — also detailed numerous human rights abuses on both sides of the conflict. According to Michelle Bachelet, the UN high commissioner for human rights, Ethiopian national forces, along with their Eritrean allies, were responsible for the bulk of those atrocities.
The war has also created a humanitarian crisis: Aid to Tigray slowed to a trickle in July due to a government blockade which prevented trucks with food, medical supplies, and fuel from entering the region. In September, UN humanitarian aid chief Martin Griffiths warned Tigray was on the verge of famine and called the crisis “a stain on our conscience” in an interview with the Associated Press.
Shortly thereafter, the Ethiopian government moved to expel seven UN officials from the country, accusing them of “meddling” in the nation’s affairs and diverting humanitarian aid to TPLF forces. The government had previously ordered the Dutch contingent of Médecins Sans Frontières (Doctors Without Borders), as well as the Norwegian Refugee Council, to cease operations.
In October, Ethiopian national defense forces began conducting airstrikes in Mekele, which have killed multiple children. While Abiy’s government has claimed it was targeting military installations, some witnesses say the airstrikes actually hit civilian targets. The central government denied intentionally targeting civilians, according to Reuters, and communications blackouts in the region make verification difficult.
So far, Western leaders have made statements calling for an end to hostilities and warning of the dire humanitarian situation, but now are beginning to back up their admonitions with real consequences for Ethiopia’s central government.
Notably, the US suspended Ethiopia from the African Growth and Opportunity Act, or AGOA, on Tuesday “for gross violations of internationally recognized human rights.”
AGOA allows certain countries to export goods duty-free to the US, and suspending Ethiopia’s participation will be a serious economic blow; last year, Ethiopia shipped $245 million worth of goods to the US under AGOA, according to Al Jazeera — nearly half of all its exports to the US.
President Joe Biden’s administration has also sent Jeffrey Feltman, its Horn of Africa envoy and a veteran diplomat, to try to negotiate a de-escalation of the conflict after Abiy declared a six-month state of emergency earlier this month. However, Feltman told Reuters, “We’re not getting much response, the military logic is still prevailing.”
Biden has threatened further sanctions against Ethiopian leadership, and a bipartisan group of senators has put forward legislation specifically targeting the actors prolonging and benefiting from the conflict, but as Politico’s Nahal Toosi points out, both sides of the conflict seem unwilling to budge.
“The problem is that you have multiple objects that heretofore have proven largely unmovable,” Toosi quotes an unnamed State Department official as saying about the situation. “It remains to be seen whether the shifting dynamic will cause at least one of those objects to show a little more flexibility.”
A new religion case forces the Supreme Court to confront the legacy of one of its cruelest decisions.
Dunn v. Ray (2019) is the kind of Supreme Court decision that a comic book supervillain might write. Widely denounced, even by prominent conservatives, when it was handed down, Ray held that a Muslim inmate in Alabama could be executed without his imam present — even though the state permitted Christian inmates to have a spiritual adviser present during their execution.
As Justice Elena Kagan wrote in dissent, one of the Constitution’s “clearest command[s]” is that “one religious denomination cannot be officially preferred over another.” But that’s exactly what the Court permitted in Ray.
After witnessing the bipartisan backlash to this decision — the conservative National Review’s David French labeled it a “grave violation of the First Amendment” — the Court eventually started to slink away from it. In Murphy v. Collier (2019), decided only a few months after Ray, the Court temporarily blocked the execution of a Buddhist inmate in Texas — unless that state “permits Murphy’s Buddhist spiritual adviser or another Buddhist reverend of the State’s choosing to accompany Murphy in the execution chamber during the execution.”
Most recently, in Dunn v. Smith (2021) the Court seemed to suggest that all people who are being executed, regardless of their faith, must be allowed to have a spiritual adviser present. Although there was no majority opinion in Smith, even some of the dissenting justices conceded that they’d been beaten. “It seems apparent that States that want to avoid months or years of litigation delays,” Justice Brett Kavanaugh wrote in a brief dissenting opinion, “should figure out a way to allow spiritual advisers into the execution room.”
And yet, while the Court’s treatment of Domineque Ray, the inmate in Ray, appears to be discredited, the Court has yet to tie up several loose ends left over from that decision, including questions about which procedural barriers can be erected between death row inmates and their spiritual advisers, and questions about what such advisers may do to comfort a dying prisoner.
These issues are front and center in Ramirez v. Collier, which will be argued before the justices on Tuesday. Texas permits John Ramirez, the death row inmate at the center of this case, to have his pastor present during his execution. But the state permits the pastor neither to lay hands on Ramirez nor to pray audibly over him.
The fundamental question in Ramirez, in other words, is whether a death row inmate is allowed to actually receive spiritual comfort during his execution — or whether Ramirez’s pastor must simply stand there, doing little to ease a dying man’s final moments.
Federal judges have a ghoulish duty. Whenever an execution draws nigh, judges are inundated with motions from capital defense lawyers trying to save their client’s life — or at least to ensure that the execution is performed as humanely as possible.
Because the Supreme Court is the nation’s court of last resort, many of these disputes eventually reach the justices. And so the justices must contend with a steady stream of emergency death penalty cases, often with only a few hours to review them.
The burden of spending years deciding who lives and who dies weighs differently on different justices. Some proclaim, as Justice Harry Blackmun did a few months before his retirement in 1994, that they “no longer shall tinker with the machinery of death.” Blackmun — and more recently, Justices Ruth Bader Ginsburg and Stephen Breyer — concluded, after decades of hearing last-minute capital appeals, that the death penalty is unconstitutional.
“Factual, legal, and moral error gives us a system that we know must wrongly kill some defendants,” Blackmun wrote.
In Ray, five conservative justices took the polar opposite approach. They attempted to quell the tidal wave of emergency death penalty motions by cutting off many inmates’ ability to file them in the first place. Domineque Ray’s error, these justices claimed, was that he waited too long to bring a lawsuit insisting that his imam be present at his execution.
It was a singularly unpersuasive claim — so unpersuasive that many observers accused the Court of offering a pretextual excuse to deny relief to a Muslim. Ray had filed his lawsuit just five days after a prison warden formally denied Ray’s request to have his imam comfort him during his execution. The Court’s explanation for its decision was quite literally unbelievable.
No doubt with the Court’s decision in Ray in mind, Texas spends the lion’s share of its brief in Ramirez accusing Ramirez of making minor procedural errors that supposedly doom his case. The brief spends an entire subsection, for example, arguing that Ramirez should lose because, when he filed a grievance asking to have his pastor present at his execution, he didn’t specifically state that the pastor should be allowed to speak.
Indeed, Texas spends only about a dozen pages of a 62-page brief arguing that its policy of forbidding a death row inmate’s spiritual adviser from speaking or touching the inmate can be justified under federal civil rights law.
The specific law at issue in this case is the Religious Land Use and Institutionalized Persons Act. It forbids prisons from imposing a “substantial burden” on an inmate’s faith, unless that burden is “in furtherance of a compelling governmental interest” and the prison uses “the least restrictive means of furthering that compelling governmental interest.”
That should be a difficult burden for Texas to carry in this case. Among other things, as Ramirez’s lawyers argue in his brief, until fairly recently Texas permitted pastors to touch and speak to death row inmates while they were being executed. The brief even quotes from a book, authored by a former Texas criminal justice official, that recounts past executions where chaplains placed their hands on the dying man’s knee. So it’s tough for Texas to argue that its current policy uses the “least restrictive” method of executing inmates, when it used to have a less restrictive policy.
To the extent that Texas even tries to defend its current policy, much of its defense rests on unlikely scenarios that could only occur if Texas’s own death chamber is run by rank incompetents. Texas argues, for example, that Ramirez’s pastor must not be allowed to touch him “in the event the inmate escaped his restraints, smuggled in a weapon, or otherwise became a threat in the chamber.” The fear is that “a spiritual adviser standing close enough to touch the inmate would be in harm’s way or in a position to assist the inmate.”
Texas, in other words, offers only a weak defense of its actual policy. It rests most of its argument on a hope that a majority of the justices will repeat their performance in Ray and rely on a procedural reason to deny Ramirez the relief he seeks.
The case for pessimism, if you are Ramirez’s lawyers, is fairly straightforward. In Smith, only the three liberals, plus conservative Justice Amy Coney Barrett, took a clear position in favor of religious freedom on death row. Roberts, Thomas, and Kavanaugh all dissented. That means that either Justice Samuel Alito or Justice Neil Gorsuch (or maybe both) silently voted in favor of the inmate in Smith.
But Alito and Gorsuch are both die-hard supporters of the death penalty. If you are a capital defense attorney and you are counting on their vote, you’re normally in trouble.
That said, there are also a few reasons for Ramirez’s lawyers to be optimistic that they can secure five votes.
One notable difference between Ramirez and Ray is that Ray arose on the Court’s shadow docket, a mix of emergency motions and other expedited requests that are typically decided in short order without full briefing or oral argument. Ramirez, by contrast, will be heard on the Court’s regular docket and will receive an oral argument.
That distinction matters because the Supreme Court ordinarily reserves full briefing and argument for cases that have either divided lower court judges or that involve unusually important questions of federal law. It’s unlikely that the Court would have agreed to hear Ramirez’s case if it thought that the correct answer turned on a minor procedural error that is unique to just this one case.
Although Smith did not produce a majority opinion, Kagan wrote an opinion, joined by Barrett and the other two liberal justices, that lays out a possible path forward in Ramirez. Kagan argued that states with restrictive policies governing spiritual advisers can simply adopt the practices already used by other jurisdictions. “In the last year, the Federal Government has conducted more than 10 executions attended by the prisoner’s clergy of choice,” Kagan noted, implying that states could copy the federal government’s procedures and do the same.
A state that fears a particular member of the clergy may present a security risk “can do a background check on the minister; it can interview him and his associates [and] it can seek a penalty-backed pledge that he will obey all rules,” Kagan wrote. But it can’t root its policy in speculative fears that a pastor may help an inmate stage a daring escape in the middle of their execution.
So while the outcome of this lawsuit is not entirely certain, Ramirez has good reason to hope that, in his final moments, he will receive spiritual comfort.
Key To The Mint and Dear Lady show out - Key To The Mint and Dear Lady showed out when the horses were exercised here on Monday (Nov. 8) morning.
Sankalp Gupta becomes India’s 71st GM - To achieve the GM title, a player has to secure three GM norms and cross the live rating of 2,500 Elo points
Mexico City Grand Prix | Verstappen wins to stretch F1 lead - Seven-times world champion Hamilton was second for Mercedes, holding off Red Bull’s Sergio Perez who became the first Mexican to stand on his home podium
T20 World Cup | Want to carry momentum into semifinals: Babar Azam - Scotland captain Kyle Coetzer hoped that the team’s performance had inspired youngsters back home to take up the sport.
Makhtoob well prepared to deliver in feature event - Trainer S.K. Sunderji’s ward Makhtoob is well tuned and may score over his rivals in the Telangana Cup, the feature event of the opening day of Hyder
No compromise on protection of environment, says Kerala Industries Minister - Govt. aims at bringing about a balance between development, eco conservation: Rajeeve
State to probe effect of quarrying on landslips - Discussion being held with Geological Survey of India on this: Minister
Noteban will be marked among worst policy blunders; PM Modi has destroyed economy: Congress - On this day in 2016, Prime Minister Narendra Modi announced the decision to demonetise ₹500 and ₹1,000 currency notes
Will work to strengthen CPI(M), says Sudhakaran - Leader terms party’s censure a closed chapter
BJP demands KCR’s resignation & apology for ‘supporting’ China - Telangana CM had charged that India was losing land to China along the Arunachal Pradesh border every day
Poland clarifies abortion law after protests over mother’s death - The guidance was issued after a pregnant woman’s death was blamed on Poland’s strict abortion laws.
Italian Mafia: ’Ndrangheta members convicted as Italy begins huge trial - More than 350 alleged mobsters will face court in the biggest mafia trial in decades.
Brexit: UK-EU trade deal could collapse over NI row, says Coveney - The UK is laying foundations to suspend parts of the NI Protocol, says the Irish foreign minister.
Palma de Mallorca: Fleeing passengers shut down busy Spanish airport - Palma de Mallorca Airport closes for four hours after 21 people run from a plane across the tarmac.
Three seriously injured in knife attack on train in Germany - A 27-year-old Syrian man is arrested after the attack on a high-speed train in Bavaria.
So what is “the metaverse,” exactly? - Will we all live in the metaverse soon? Or is the idea just Second Life redux? - link
Eleven tries to adjust to SoCal life in underwhelming Stranger Things S4 teaser - There are surfer dudes, roller rinks, mean girls, and some disastrous haircuts. - link
Here are the best “early Black Friday” deals we’re seeing this weekend [Updated] - Dealmaster has new lows on Amazon tablets, AirPods Max, gaming chairs, and more. - link
Tagalong robots follow you to learn where you go - Burro makes carts that help growers of trees and vineyards with harvests. - link
These parents built a school app. Then the city called the cops - Official app was a disaster, so knowledgeable parents built an open source alternative. - link
Just like yo momma
submitted by /u/Busterx8
When every American knows that America is the best country in the world.
submitted by /u/Busterx8
It’s too bad really…..
I had a blast working there!
submitted by /u/retek
Two economists are walking down the street when they come across a pile of shit. The first economist says to the other, “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.
They continue walking until they come across a second pile of shit. The second economist turns to the first and says, “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats the pile of shit.
Walking a little more, the first economist looks at the second and says, “You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can’t help but feel like we both just ate shit for nothing.”
“That’s not true,” responded the second economist. “We increased the GDP by $200!”
submitted by /u/Lokimonoxide
It was in the news recently that Putin was visiting a school in Moscow to promote the nation’s power on the world stage. The children were allowed to ask questions before lunch.
Little Alina speaks up and says to Putin…
“I have two questions”
“Why did Russia take Crimea?”
“And why are we sending troops to Ukraine?”
Putin responds, “Good questions,” but before he can say anything else the bell goes and the kids head off for lunch.
When they come back to the classroom, there is time for more questions.
Natasha speaks up to Putin,
“I have four questions”
“Why did Russia take Crimea?”
“Why are we sending troops to Ukraine?”
“Why did the bell go off 20 minutes early?”
“And where is Alina?”
submitted by /u/MartynAndJasper